11 research outputs found

    Toward Contention Analysis for Parallel Executing Real-Time Tasks

    In measurement-based probabilistic timing analysis, the execution conditions imposed on tasks as measurement scenarios have a strong impact on the worst-case execution time estimates. The scenarios and their effects on task execution behavior have to be investigated in depth, with the aim of identifying and guaranteeing the scenarios that lead to the maximum measurements, i.e. the worst-case scenarios, and of using them to assure the worst-case execution time estimates. We propose a contention analysis to identify the worst contentions that a task can suffer from concurrent executions. The work focuses on interference on shared resources (cache memories and memory buses) from parallel executions in multi-core real-time systems. Our approach consists of searching for possible task contenders for parallel execution, modeling their contentiousness, and classifying the measurement scenarios accordingly. We identify the most contentious scenarios and their worst-case effects on task execution times. Measurement-based probabilistic timing analysis is then used to verify the proposed analysis, qualify the scenarios by contentiousness, and compare them. A parallel execution simulator for multi-core real-time systems is developed and used to validate our framework. The framework applies heuristics and assumptions that simplify the system behavior; it is a first step toward a complete approach able to guarantee the worst-case behavior

    On the Representativity of Execution Time Measurements: Studying Dependence and Multi-Mode Tasks

    Measurement-Based Probabilistic Timing Analysis (MBPTA) infers probabilistic Worst-Case Execution Time (pWCET) estimates from measurements of task execution times; the Extreme Value Theory (EVT) is the statistical tool that MBPTA applies to infer worst cases from observations/measurements of the actual task behavior. The ability of MBPTA and EVT to estimate safe/pessimistic pWCETs relies on the quality of the measurements; in particular, execution time measurements have to be representative of the actual system execution conditions and have to cover multiple possible execution conditions. In this work, we investigate statistical dependences between execution time measurements, and tasks with multiple runtime operational modes. In the first case, we outline the effects of dependences on EVT applicability as well as on the quality of the pWCET estimates. In the second case, we propose approaches to account for the different task execution modes and to guarantee safe pWCET estimates that cover them all. The proposed solutions are validated with test cases
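    The MBPTA/EVT workflow described in the abstract above can be sketched in a few lines; this is a minimal illustration only, not the authors' tool. It assumes scipy is available and uses a synthetic gamma-distributed trace as a hypothetical stand-in for real execution time measurements:

    ```python
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(0)
    # Hypothetical trace of task execution time measurements (in cycles).
    trace = 1000.0 + rng.gamma(shape=2.0, scale=50.0, size=5000)

    # "Block Maxima" paradigm: split the trace into blocks, keep each block's max.
    block = 50
    n_blocks = trace.size // block
    maxima = trace[: n_blocks * block].reshape(n_blocks, block).max(axis=1)

    # Fit a Generalized Extreme Value distribution to the block maxima (MLE).
    shape_c, loc, scale = genextreme.fit(maxima)

    # pWCET estimate at a per-block exceedance probability of 1e-3.
    p = 1e-3
    pwcet = genextreme.ppf(1.0 - p, shape_c, loc=loc, scale=scale)
    ```

    The block length and exceedance probability here are arbitrary assumptions; in practice they are exactly the kind of parameters whose representativity the paper studies.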

    Architectural performance analysis of FPGA synthesized LEON processors

    Current processors have gone through multiple internal optimizations to speed up the average execution time, e.g. pipelines and branch prediction. Besides, internal communication mechanisms and shared resources like caches or buses have a significant impact on Worst-Case Execution Times (WCETs). Obtaining an accurate estimate of a WCET is now a challenge. Probabilistic approaches provide a viable alternative to single-value WCET estimation: they consider the WCET as a probabilistic distribution associated with uncertainty or risk. In this paper, we present synthetic benchmarks and associated analyses for several LEON3 configurations on FPGA targets. Benchmarking exposes key parameters of execution time variability, allowing for accurate probabilistic modeling of system dynamics. We analyze the impact of architecture-level configurations on average and worst-case behaviors

    Study of the extreme value theory applicability for reliable and robust probabilistic worst-case execution time estimates

    Software tasks are time constrained in real-time computing systems. To ensure the safety of the critical system controlled by the real-time system, it is of paramount importance to safely estimate the worst-case execution time of each task. The optimization components of modern commercial processors reduce the average task execution time, at the cost of a worst-case execution time that is hard to determine. Many approaches for estimating a task's worst-case execution time exist, but they are usually segregated and hardly generalizable, or come at the price of very complex models. Measurement-based probabilistic timing analysis approaches are said to be easy and fast, but they suffer from a lack of systematism and confidence in their estimates. This thesis studies the applicability of the extreme value theory to a sequence of execution time measurements for the estimation of the probabilistic worst-case execution time, leading to the development of the diagxtrm tool. Thanks to a large panel of sequences of measurements from different real-time systems, the capabilities and limits of the tool are highlighted. Finally, methods are provided for determining the measurement conditions that foster the application of the theory and raise confidence in the estimates

    Study of the application of the extreme value theory for reliable and robust probabilistic worst-case execution time estimation

    Software tasks are time constrained in real-time computing systems. To ensure the safety of the critical system controlled by the real-time system, it is of paramount importance to safely estimate the worst-case execution time of each task. The optimization components of modern commercial processors reduce the average task execution time, at the cost of a worst-case execution time that is hard to determine. Many approaches for estimating a task's worst-case execution time exist, but they are usually segregated and hardly generalizable, or come at the price of very complex models. Measurement-based probabilistic timing analysis approaches are said to be easy and fast, but they suffer from a lack of systematism and confidence in their estimates. This thesis studies the applicability of the extreme value theory to a sequence of execution time measurements for the estimation of the probabilistic worst-case execution time, leading to the development of the diagXtrm tool. Thanks to a large panel of sequences of measurements from different real-time systems, the capabilities and limits of the tool are highlighted. Finally, methods are provided for determining the measurement conditions that foster the application of the theory and raise confidence in the estimates

    On the Reliability of the Probabilistic Worst-Case Execution Time Estimates

    Probabilistic Worst-Case Execution Time estimates, obtained through Measurement-Based Probabilistic Timing Analyses and statistical inference, are in dire need of formal definition and reliability. The automatic DIAGnostic tool for applying the eXTReMe value theory within the Probabilistic Timing Analysis framework that we propose defines a complete set of statistical tests for studying execution time traces, e.g. the real-time task's average execution behavior, and for estimating the extreme behavior of the task execution time, in particular the probabilistic Worst-Case Execution Time. The tool also allows defining and evaluating the reliability of the probabilistic Worst-Case Execution Time estimates obtained with the Extreme Value Theory, by applying a fuzzy logic approach. We apply the tool to traces of execution time measurements of a task running on a commercial off-the-shelf real-time multi-core system under different execution conditions. Application of the diagnostic tool to these traces validates the hypothesis of using the Extreme Value Theory for estimating the probabilistic Worst-Case Execution Time for this kind of system
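    One of the simplest diagnostics such a test suite can include is a check that the trace behaves like independent, identically distributed data, for instance via the lag-1 sample autocorrelation. The following is a hypothetical numpy-only sketch of that idea, not the actual tests of the diagnostic tool:

    ```python
    import numpy as np

    def lag1_autocorr(x):
        """Lag-1 sample autocorrelation of a 1-D trace."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

    rng = np.random.default_rng(2)
    iid_trace = rng.gamma(2.0, 50.0, size=2000)        # independent samples
    dep_trace = np.convolve(iid_trace, np.ones(8) / 8,  # artificially correlated
                            mode="valid")               # (8-point moving average)

    r_iid = lag1_autocorr(iid_trace)
    r_dep = lag1_autocorr(dep_trace)
    # For i.i.d. data, |r| should stay within roughly 2/sqrt(n);
    # a large positive r flags dependence that undermines EVT applicability.
    ```

    A real test battery would combine several such checks (stationarity, short- and long-range dependence) rather than rely on a single statistic.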

    Measurement-Based Probabilistic Timing Analysis for Graphics Processor Units (Kostiantyn Berezovskyi)

    Purely analytical worst-case execution time (WCET) estimation approaches for Graphics Processor Units (GPUs) cannot go far because of insufficient public information for the hardware. Therefore measurement-based probabilistic timing analysis (MBPTA) seems the way forward. We recently demonstrated MBPTA for GPUs, based on Extreme Value Theory (EVT) of the "Block Maxima" paradigm. In this newer work, we formulate and experimentally evaluate a more robust MBPTA approach based on the EVT "Peak over Threshold" paradigm with a complete set of tests for verifying EVT applicability. It optimally selects parameters to best-fit the input measurements for more accurate probabilistic WCET estimates. Different system configuration parameters (cache arrangements, thread block size) and their effect on the pWCET are considered, enhancing models of worst-case GPU behavior

    Measurement-Based Probabilistic Timing Analysis for Graphics Processor Units

    Architecture of Computing Systems (ARCS 2016), 4-7 April 2016, Nuremberg, Germany. Purely analytical worst-case execution time (WCET) estimation approaches for Graphics Processor Units (GPUs) cannot go far because of insufficient public information for the hardware. Therefore measurement-based probabilistic timing analysis (MBPTA) seems the way forward. We recently demonstrated MBPTA for GPUs, based on Extreme Value Theory (EVT) of the "Block Maxima" paradigm. In this newer work, we formulate and experimentally evaluate a more robust MBPTA approach based on the EVT "Peak over Threshold" paradigm with a complete set of tests for verifying EVT applicability. It optimally selects parameters to best-fit the input measurements for more accurate probabilistic WCET estimates. Different system configuration parameters (cache arrangements, thread block size) and their effect on the pWCET are considered, enhancing models of worst-case GPU behavior
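    The "Peak over Threshold" paradigm mentioned above can be sketched with a Generalized Pareto fit of the excesses over a high threshold. This is a minimal illustration assuming scipy, a synthetic trace in place of GPU measurements, and an arbitrary threshold choice (the 95th percentile); it also assumes the fitted shape parameter is nonzero:

    ```python
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(1)
    # Hypothetical execution time measurements (in cycles).
    trace = 1000.0 + rng.gamma(2.0, 50.0, size=5000)

    u = float(np.quantile(trace, 0.95))   # threshold: an assumed, not optimal, choice
    excesses = trace[trace > u] - u       # "peaks over threshold"

    # Fit a Generalized Pareto Distribution to the excesses (location fixed at 0).
    xi, _, beta = genpareto.fit(excesses, floc=0)

    zeta = excesses.size / trace.size     # empirical threshold-exceedance rate
    p = 1e-5                              # target exceedance probability
    # GPD tail quantile (valid for xi != 0): value x with P(X > x) = p.
    pwcet = u + (beta / xi) * ((p / zeta) ** (-xi) - 1.0)
    ```

    The paper's contribution is precisely the part this sketch hard-codes: selecting the threshold and fit parameters optimally, and testing that EVT is applicable at all.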


    Architectural performance analysis of LEON processors synthesized on FPGA

    Modern embedded processors have gone through multiple internal optimizations to speed up the average execution time, e.g. caches, pipelines, branch prediction. Besides, internal communication mechanisms and shared resources like caches or buses have a significant impact on Worst-Case Execution Times (WCETs). Obtaining an accurate estimate of a WCET is now a challenge. Probabilistic approaches provide a viable alternative to single-value WCET estimation: they consider the WCET as a probabilistic distribution associated with uncertainty or risk. In this paper, we present synthetic benchmarks and associated analyses for several LEON3 configurations on FPGA targets. Benchmarking exposes the effect of key parameters on task execution time variability, allowing for accurate probabilistic modeling of system dynamics. We analyze the impact of architecture-level configurations on the average and worst-case behaviors of the system